Kernel least mean square algorithm with constrained growth
Authors
Abstract
The linear least mean squares (LMS) algorithm has recently been extended to a reproducing kernel Hilbert space, resulting in an adaptive filter built from a weighted sum of kernel functions evaluated at each incoming data sample. With time, the size of the filter, along with its computation and memory requirements, increases. In this paper, we propose a new, efficient methodology for constraining the growth in length of the radial basis function (RBF) network produced by the kernel LMS algorithm without a significant sacrifice in performance. The method applies sequential Gaussian elimination steps to the Gram matrix to test the linear dependency of the feature vector corresponding to each new input on all the previous feature vectors. This provides an efficient way of continuing the learning while restricting the number of kernel functions used. © 2008 Elsevier B.V. All rights reserved.
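To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of a KLMS filter whose RBF network is only allowed to grow when the new feature vector fails a linear-dependency test against the stored centres. It uses a least-squares solve on the Gram matrix as a stand-in for the paper's sequential Gaussian elimination steps, and all names (GrowthConstrainedKLMS, eta, nu, the Gaussian kernel width sigma) are illustrative assumptions.

import numpy as np

def gauss_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    d = x - y
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

class GrowthConstrainedKLMS:
    """KLMS whose dictionary grows only when the new feature vector is
    (numerically) linearly independent of the stored centres."""

    def __init__(self, eta=0.2, nu=1e-2, sigma=1.0):
        self.eta, self.nu, self.sigma = eta, nu, sigma
        self.centres, self.alphas = [], []   # RBF centres and their weights

    def predict(self, x):
        return sum(a * gauss_kernel(c, x, self.sigma)
                   for c, a in zip(self.centres, self.alphas))

    def update(self, x, d):
        e = d - self.predict(x)              # instantaneous error
        if not self.centres:
            self.centres.append(x)
            self.alphas.append(self.eta * e)
            return e
        # Linear-dependency test on the Gram matrix: how well can the new
        # feature vector be represented by the stored ones?
        K = np.array([[gauss_kernel(ci, cj, self.sigma)
                       for cj in self.centres] for ci in self.centres])
        k = np.array([gauss_kernel(c, x, self.sigma) for c in self.centres])
        coeffs, *_ = np.linalg.lstsq(K, k, rcond=None)
        residual = gauss_kernel(x, x, self.sigma) - k @ coeffs
        if residual > self.nu:
            # Nearly independent: admit a new centre (network grows).
            self.centres.append(x)
            self.alphas.append(self.eta * e)
        else:
            # Nearly dependent: spread the update over the existing centres.
            self.alphas = [a + self.eta * e * c_j
                           for a, c_j in zip(self.alphas, coeffs)]
        return e

An efficient implementation would update the elimination factors of the Gram matrix incrementally rather than re-solving from scratch at every sample, which is what the sequential Gaussian elimination steps are intended to achieve.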
Similar Articles
Kernel Least Mean Square Algorithm
A simple yet powerful learning method, called the KLMS, is presented by combining the famed kernel trick and the least-mean-square (LMS) algorithm. General properties of the KLMS algorithm are demonstrated regarding its well-posedness in very high dimensional spaces using Tikhonov regularization theory. An experiment is studied to support our conclusion that the KLMS algorithm can be readily u...
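As a hedged illustration of the basic recursion (assuming a Gaussian kernel, a step size eta, and illustrative function names), KLMS keeps every input as a centre and weights its kernel by the prediction error made at that sample:

import numpy as np

def klms_predict(centres, errors, x, eta=0.2, sigma=1.0):
    """KLMS prediction: a weighted sum of kernels centred at past inputs,
    with weights eta * e(j) accumulated online (a minimal sketch)."""
    return eta * sum(e * np.exp(-np.linalg.norm(c - x) ** 2 / (2 * sigma ** 2))
                     for c, e in zip(centres, errors))

def klms_train(X, d, eta=0.2, sigma=1.0):
    """Run KLMS over a sequence of (input, desired) pairs."""
    centres, errors = [], []
    for x, dn in zip(X, d):
        e = dn - klms_predict(centres, errors, x, eta, sigma)
        centres.append(x)   # the network grows by one kernel per sample
        errors.append(e)
    return centres, errors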
Mean square convergence analysis for kernel least mean square algorithm
In this paper, we study the mean square convergence of the kernel least mean square (KLMS). The fundamental energy conservation relation has been established in feature space. Starting from the energy conservation relation, we carry out the mean square convergence analysis and obtain several important theoretical results, including an upper bound on step size that guarantees the mean square con...
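As a sketch of the starting point of such an analysis (with assumed notation: weight-error vector \tilde{\Omega}, a priori error e_a, a posteriori error e_p), the classical energy conservation relation carried into the feature space reads

\|\tilde{\Omega}(i)\|^2 + \frac{e_a^2(i)}{\kappa(\mathbf{u}(i),\mathbf{u}(i))} = \|\tilde{\Omega}(i-1)\|^2 + \frac{e_p^2(i)}{\kappa(\mathbf{u}(i),\mathbf{u}(i))}

where \kappa is the kernel; mean square results such as the step-size bound follow from taking expectations of a relation of this kind.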
Mean-Square Performance of the Constrained LMS Algorithm
The so-called constrained least mean-square algorithm is one of the most commonly used linear-equality-constrained adaptive filtering algorithms. Its main advantages are adaptability and relative simplicity. In order to gain theoretical insights into the performance of this algorithm, we examine its mean-square convergence and derive an expression for its steady-state mean-square deviation. Ou...
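A minimal sketch of a linear-equality-constrained LMS update of this kind, assuming constraints of the form C.T @ w = b and a Frost-style projection (variable names are illustrative):

import numpy as np

def constrained_lms(X, d, C, b, mu=0.01):
    """Sketch of a linearly-constrained LMS filter that keeps C.T @ w = b
    at every step by projecting the unconstrained update."""
    n_taps = X.shape[1]
    # Projection onto the null space of the constraints, plus the fixed
    # component that satisfies them exactly.
    CtC_inv = np.linalg.inv(C.T @ C)
    P = np.eye(n_taps) - C @ CtC_inv @ C.T
    f = C @ CtC_inv @ b
    w = f.copy()                      # feasible starting point
    for x, dn in zip(X, d):
        e = dn - x @ w                # a priori output error
        w = P @ (w + mu * e * x) + f  # project, then restore the constraint
    return w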
Constrained Least Mean Logarithmic Square Algorithm: Design and Performance Analysis
This paper introduces a novel constrained adaptive filtering algorithm based on a relative logarithmic cost function, termed the Constrained Least Mean Logarithmic Square (CLMLS) algorithm. The proposed CLMLS algorithm elegantly adjusts the cost function based on the amount of error and thereby achieves better performance than the conventional Constrained LMS (CLMS) algorithm. With no assumption...
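As an illustrative assumption of how such a recursion can look, the sketch below combines a projection-based constrained update with the stochastic gradient of a logarithmic cost of the form J(e) = e^2 - (1/alpha) ln(1 + alpha e^2), whose error nonlinearity is proportional to e^3 / (1 + alpha e^2); it is not the paper's exact CLMLS algorithm, and all names are placeholders.

import numpy as np

def constrained_lmls(X, d, C, b, mu=0.01, alpha=1.0):
    """Illustrative sketch: a constrained adaptive filter driven by the
    gradient of J(e) = e^2 - (1/alpha)*ln(1 + alpha*e^2)."""
    n_taps = X.shape[1]
    CtC_inv = np.linalg.inv(C.T @ C)
    P = np.eye(n_taps) - C @ CtC_inv @ C.T   # projector onto constraint null space
    f = C @ CtC_inv @ b
    w = f.copy()
    for x, dn in zip(X, d):
        e = dn - x @ w
        g = (e ** 3) / (1.0 + alpha * e ** 2)   # logarithmic-cost error term
        w = P @ (w + mu * g * x) + f            # constrained update
    return w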
Fixed budget quantized kernel least-mean-square algorithm
We present a quantization-based kernel least mean square (QKLMS) algorithm with a fixed memory budget. In order to deal with the growing support inherent in online kernel methods, the proposed method utilizes a combined growing and pruning technique and defines a criterion, significance, based on the weighted statistical contribution of a data center. This method doesn't need any a priori informatio...
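A hedged sketch of a fixed-budget, quantized KLMS of this general kind is given below. The quantization step merges updates into the nearest existing centre, while the pruning step uses the magnitude of a centre's coefficient as a crude stand-in for the paper's significance measure; all parameter names are assumptions.

import numpy as np

def qklms_fixed_budget(X, d, eta=0.2, sigma=1.0, eps=0.5, budget=50):
    """Sketch of a quantized KLMS with a hard cap on the dictionary size."""
    centres, alphas = [], []
    kernel = lambda a, b: np.exp(-np.linalg.norm(a - b) ** 2 / (2 * sigma ** 2))
    for x, dn in zip(X, d):
        y = sum(a * kernel(c, x) for c, a in zip(centres, alphas))
        e = dn - y
        if centres:
            dists = [np.linalg.norm(c - x) for c in centres]
            j = int(np.argmin(dists))
            if dists[j] <= eps:               # quantize: merge into nearest centre
                alphas[j] += eta * e
                continue
        centres.append(x)                     # otherwise admit a new centre
        alphas.append(eta * e)
        if len(centres) > budget:             # enforce the fixed memory budget
            k = int(np.argmin(np.abs(alphas)))  # stand-in for "significance"
            centres.pop(k)
            alphas.pop(k)
    return centres, alphas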
Journal: Signal Processing
Volume: 89, Issue: -
Pages: -
Publication date: 2009